Chernoff Bound

Let $X_1, X_2, \ldots, X_n$ be independent $\{0,1\}$-valued random variables and let $p_i = \mathbb{E}[X_i]$, where $0 < p_i < 1$. Then the sum $S = \sum_{i=1}^n X_i$, which has mean $\mu = \sum_{i=1}^n p_i$, satisfies

$$\mathrm{Pr}[S \geq (1+\epsilon)\mu] \leq e^{-\frac{\epsilon^2\mu}{2+\epsilon}}.$$
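
One standard derivation applies Markov's inequality to the moment generating function: for any $t > 0$,

$$\mathrm{Pr}[S \geq (1+\epsilon)\mu] \leq \frac{\mathbb{E}[e^{tS}]}{e^{t(1+\epsilon)\mu}} = \frac{\prod_{i=1}^n \bigl(1 + p_i(e^t - 1)\bigr)}{e^{t(1+\epsilon)\mu}} \leq e^{(e^t - 1)\mu - t(1+\epsilon)\mu},$$

using independence and $1 + x \leq e^x$. Choosing $t = \ln(1+\epsilon)$ and applying $\ln(1+\epsilon) \geq \frac{2\epsilon}{2+\epsilon}$ yields the stated exponent.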

Corollary

Let $X_1, X_2, \ldots, X_n$ be independent $\{0,1\}$-valued random variables and let $p_i = \mathbb{E}[X_i]$, where $0 < p_i < 1$. Let $S = \sum_{i=1}^n X_i$ and $\mathbb{E}[S] = \mu$. For $\epsilon \in (0,1)$,

$$\mathrm{Pr}[|S - \mu| \geq \epsilon\mu] \leq 2e^{-\epsilon^2\mu/3}.$$

This follows by a union bound: for $\epsilon \in (0,1)$ the upper-tail exponent satisfies $\frac{\epsilon^2}{2+\epsilon} \geq \frac{\epsilon^2}{3}$, and the analogous lower-tail bound $\mathrm{Pr}[S \leq (1-\epsilon)\mu] \leq e^{-\epsilon^2\mu/2}$ holds.
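
As a quick sanity check, here is a minimal simulation sketch (the values of $n$, $p$, and $\epsilon$ are arbitrary illustrative choices, restricted to the i.i.d. case $p_i = p$) comparing the empirical two-sided tail probability with the corollary's bound:

```python
# Minimal sketch: empirically check Pr[|S - mu| >= eps*mu] <= 2*exp(-eps^2*mu/3)
# in the i.i.d. special case p_i = p, so that S ~ Binomial(n, p).
import numpy as np

rng = np.random.default_rng(0)
n, p, eps, trials = 1000, 0.3, 0.2, 100_000  # arbitrary illustrative parameters

mu = n * p                                   # mean of S
S = rng.binomial(n, p, size=trials)          # one draw of S per trial
empirical = np.mean(np.abs(S - mu) >= eps * mu)
chernoff = 2 * np.exp(-eps**2 * mu / 3)

print(f"empirical tail probability: {empirical:.2e}")  # should fall below the bound
print(f"Chernoff bound:             {chernoff:.2e}")
```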


Example of a concentration inequality

Compare this with the Gaussian tail bound: since $\mathrm{Var}(S) = \sum_{i=1}^n p_i(1-p_i) \leq \mu$, the standard deviation of $S$ is at most $\sqrt{\mu}$, so a deviation of $\epsilon\mu$ corresponds to at least $k = \epsilon\sqrt{\mu}$ standard deviations.
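
For a Gaussian random variable $Z$ with mean $\mu$ and standard deviation $\sigma \leq \sqrt{\mu}$, the tail bound $\mathrm{Pr}[|Z - \mu| \geq k\sigma] \leq 2e^{-k^2/2}$ with $k = \epsilon\sqrt{\mu}$ gives

$$\mathrm{Pr}[|Z - \mu| \geq \epsilon\mu] \leq 2e^{-\epsilon^2\mu/2},$$

so the corollary's bound $2e^{-\epsilon^2\mu/3}$ matches the Gaussian decay rate up to the constant in the exponent: the sum $S$ of independent $\{0,1\}$-valued variables has Gaussian-like tails.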